
Markov chain noun

  • (probability theory) A discrete-time stochastic process satisfying the Markov property.
Russian: цепь Ма́ркова
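
Formally, the Markov property means the conditional distribution of the next state depends only on the current state, not on the earlier history (a standard formulation of the property named in the definition):

  P(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)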